Search Results: "awm"

17 October 2008

Andrew McMillan: Squid packages with IPv6 support enabled

I've been helping Amos Jeffries with a little testing over the last week, nailing some IPv6 bugginess preparatory to the upcoming 3.1 release of Squid. In the process of that I've built some Squid3 packages with IPv6 support enabled from current HEAD. Get 'em while they're hot. Note that these are in a 'works for me' state. They have been built on Lenny, and I have them running on both Lenny and Sid. I haven't put them somewhere you could apt-get them from because you should be paying attention if you're going to use them! PS. If you can't click through to Amos' site it's because you're using IPv6 and the EveryDNS servers are continuing to serve up old data for his domain. Sigh.

11 October 2008

Andrew McMillan: Shiny New Laptop

After a few years of only buying laptops with Intel hardware, today I bought something totally different. It's not really what I wanted (which was an HP HDX 16t) but I get the feeling that none of these 16" HD 1080 laptops will make it to New Zealand for a while yet, and the NZ dollar has done such a nosedive recently that it's better not to wait any longer. In the places that hold stock there seem to be some good specials around at the moment, and as the owner of a new free, open-source consulting business (i.e. a cheap bastard) I went shopping for the cheapest dual-core I could find with a half-decent screen, and I found the Asus X53K for $999 (USD$589) at Dick Smith, including a 2G RAM upgrade to take it to 3G. It's entirely non-Intel, with a 2GHz Turion dual-core, ATI Radeon X2300 with 1440x900 panel, Atheros AR2425 wifi and 160G HD. I'd bought a replacement 320G hard drive even before I got the laptop, so now I have a pristine, unbooted 160G hard drive with the install files for some other OS on it - no doubt I'll find a use for the disk, at least! Since AMD got ATI to release all their chip documentation earlier this year I felt able to shell out for this, rather than the extra $100 for the model next to it, and it was nice, too, to get home and find that Atheros have recently released the HAL for their a/b/g chips. Which presumably means that they haven't done so for their 'n' chipsets, and I should continue to steer clear of that technology for a while yet... I'm running Debian GNU/Linux 'Sid' on the Asus X53K, and everything pretty much just works out of the box. My installation process was to rsync the old laptop onto a new disk, and boot the new laptop from that - after compiling a new kernel more appropriate to the changed hardware. After overcoming my own stupidity in not syncing the /dev/ underneath udev, which I easily googled my way out of, the only problem I've found so far is that the free radeon driver doesn't do 3D for me.
Presumably the non-free ones would, but they won't compile against my 2.6.27 kernel so I don't know for sure. Fortunately I don't use 3D for anything so it's not a huge inconvenience to me. With 3G RAM and a fast 320G hard drive the laptop actually is an upgrade for me, and it has a webcam as well, which I expect I'll look at in much the same way as I did the fingerprint reader on the old laptop. It will be good to finally hand that old one back to Catalyst, who have given me the flexibility to take my time on this. Now to try and peel off all these stickers without damaging anything!

5 October 2008

Andrew McMillan: Apology Accepted

It is nice to see someone apologising for their planned failure to consider Linux users. It's ridiculous that they even have to. It seems to me that these people have spent way too much effort on making the logo and menus scroll in from the left and right of the screen, and not enough effort on the actual functionality of their website. I fail to understand what benefit they have gained from using the Pizza UI for their logo & menus (yes, really) rather than using simple links - or CSS-based menus, if they needed something fancy. The page layout doesn't actually need anything more than simple text links. The logo (thankfully) does nothing after its page-load scroll. For extra 'fail' marks they substitute graphics when I initially arrive with Javascript disabled (and wearing my tinfoil hat) but the graphics give me the appearance of a menu without actually performing a useful function. The user-interface situation gets worse later, when I am taken through five successive screens to upload and classify some new media. It seems that in order to bore people who are foolishly trying to upload their home movies over "poorband" they attempt to display a progress bar. A better approach would be to ask me all of the questions about the media I am uploading on one single page and let me get on with something useful. This might be news, but it seems that such an interface could be done in plain HTML, and wouldn't actually require me to enable Javascript for a site which is riddled with cross-site scripting flaws. Since it's the school holidays at the moment we took advantage of Te Papa's excellent institution of "Late Night Thursday" last week and visited the other end of the website in person. My disappointment in the website was then entirely eclipsed by my disappointment in the real-life part of the exhibit.
The "walk on the map and see photos of the area" component was decidedly clunky, with no clear feedback between stepping on a tile and media appearing on the wall. In fact it seems most tiles don't actually do anything, and there are only a few small screens in the walls to display the images. The "postcards blowing in the wind" gimmick amused the kids for about 20 seconds, tops. The 'wall' was quite a lot more successful, amusing me and the kids for nearly half an hour, though we're unlikely to go back next time we're there. There were plenty of annoyances also, in particular that it doesn't appear to use the full resolution of uploaded photos, the interface is slow, and it is difficult to see what you are doing at times. It's good that they've started practising their apologies, firstly targeting the small (but geeky :-) segment of New Zealanders who are Linux users trying to help them out by providing them with free content. Now I am looking forward to their further apologies to larger groups of justifiably annoyed people, culminating with the ultimate one, where they apologise to all of the taxpayers of New Zealand who funded this incredible waste of money. I can imagine that something interesting could be done with a lot of media of New Zealand and, now that Te Papa has started to amass a collection in this way, perhaps they will.

31 August 2008

Andrew McMillan: Leaving Catalyst

After 11 years, 1 month and 28 days it's time for me to farewell Catalyst IT. While this is something that I've been working towards for the last couple of years, my reasons for leaving don't reflect any large dissatisfaction with Catalyst, but rather my own disinterest in fulfilling the kind of role deemed appropriate to an executive director of the company it has become. As well, it is Catalyst's current and continuing success which provides me with the opportunity to fade out, like the Cheshire Cat, without the need to have strong plans. I do believe that the use and understanding of free and open-source software in New Zealand has matured to a point where there is the potential for an independent to make a few dollars reviewing or planning for open source projects in corporate and government areas. I hope I'll find out. And a great big thank you to every one of the staff, clients, suppliers and friends who have made Catalyst such a fantastic place to be a part of. It really will be a hard act to follow. :-)

7 May 2008

Andrew McMillan: Finally: DAViCal 0.9.5

Finally, I have released DAViCal 0.9.5. Hopefully this will resolve the series of installation- and upgrade-related problems which plagued the 0.9.4 release. Thanks to everyone for being patient while this release was thoroughly tested through five pre-release versions, and especially thanks to those patient people who helped test those pre-releases. Now if I don't get too distracted by: ... then maybe I will be able to really concentrate on nailing the scheduling extensions work over the next couple of months... Wish me luck!

28 November 2007

Andrew McMillan: Which is the more interesting hackspace: X or Linux?

A curiously interesting thread has sprung up on the Xorg mailing list recently regarding the lack of bandwidth the developers might have for accepting patches. Bernardo Innocenti, who hacks on OLPC stuff amongst other things, wants to get some patches reviewed for acceptance, but there aren't enough people around with the time and inclination to review and accept his work. He did a quick and dirty analysis of Xorg LoC vs Linux LoC, arriving at "the X codebase is roughly half the Linux codebase" (which surprised me, actually; I'd always thought it was larger) and then a similarly rough analysis that suggests that Xorg has roughly 1/20th of the developer base. To some extent Bernardo's metrics are arguable, but the basic fact is that Linux gets the lion's share of the developers. Is this what we should expect? Is this a healthy way for the free and open source software community to support the development of what I personally would like to see as the future desktop base? Linux does a fantastic job as an OS, right now. It's been doing it for years, really, but we still need to get behind the layers further up the stack. And all credit to those stalwarts of X development who are shouldering more than their fair share of the burden. So if you were looking for an open source software project where you can be appreciated, don't overlook X.
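This kind of back-of-the-envelope measurement is easy to reproduce with nothing more than find and wc (a sketch only; the checkout paths are hypothetical, and like the original analysis it counts blank lines and comments indiscriminately):

```shell
# Crude lines-of-code tally over the C sources in a tree.
loc() {
    find "$1" \( -name '*.c' -o -name '*.h' \) -type f 2>/dev/null \
        | xargs cat 2>/dev/null | wc -l
}

# Hypothetical checkout locations; substitute your own.
xorg=$(loc "$HOME/src/xserver")
linux=$(loc "$HOME/src/linux")
echo "xorg=$xorg linux=$linux lines"
```

A serious comparison would at least strip comments and generated files, but a factor-of-two conclusion like the one in the thread survives that level of noise.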

4 November 2007

Daniel Burrows: Lessig at the University of Washington

On Friday 2007-11-02, I watched Larry Lessig give a presentation at the University of Washington entitled "Is Google (2008) Microsoft (1998)?" Since it was Lessig, the talk was articulate and thought-provoking, and he used his slides very well (unlike many presenters who just read bullet points).

The argument he made is that Google is like Microsoft in some ways (and so are other companies like Facebook). In his analysis, the problem with Microsoft was that so many companies made themselves Microsoft-dependent, which both forced other companies to follow suit (due to network effects) and gave Microsoft leverage they could use against everyone in their ecosystem (e.g., Netscape). According to Lessig, Microsoft was not evil to use this leverage, but you have to recognize that Microsoft will do what is best for Microsoft, and this may not be what is best for you. He then suggested that we should be aware of the potential for the same thing to happen; in support of this point, he quoted excerpts from the terms of service from the APIs of Google and Facebook, which contain typical statements like "we reserve the right to terminate your use of this service for any reason or for no reason", and asked: "What if Microsoft had written this license agreement?"

But it seemed to me that this wasn't his main point. He was using this observation as a springboard to talk about issues revolving around competition and public policy. Specifically, he feels that we should work towards a more just model of interaction between companies and users whose work makes the company more valuable (by contributing to a software ecosystem, by posting content on the Web site, etc). But the correct way to do this (he says) is not via competition, after-the-fact litigation, or voluntary codes of conduct: competition may not be around to enforce good behavior (as we've seen with Microsoft), litigation may be too late and ineffective (again, see Microsoft), and voluntary codes of conduct will disappear as soon as they get in the way of the bottom line. The correct approach, he argued, is across-the-board regulation so that all the companies have to play by the same rules.

From there he went on to talk about governmental corruption. He said (quoting Robert Reich's book Supercapitalism) that the reason we consistently look for solutions in the market and in voluntary compliance is that our governmental system is broken and does not effectively regulate corporations in the public interest. But Lessig is optimistic that we can change things (he joked that his publisher was unhappy with this point of view, because his brand has been built on pessimism). In his view, the politicians in Washington, by and large, want to be honest and do good, but they aren't able to within the current political system. For instance, on the few occasions that he managed to get access to lawmakers to discuss copyright issues, it was often the first time they had heard that there was more than one side to the argument. He thinks we need a national political movement that will shame politicians into being less corrupt.

This is all a simplification of what he said, of course; unfortunately, I wasn't able to find a recording of this talk on Lessig's Web site, or I'd point you at primary source material. I don't know if this is because he chose not to post it, or because it just hasn't been made available yet.

My view

In general, I thought that the speech was very interesting and well thought out. But as much as I'd like to buy into his optimism that we can fix our system, I don't think it's well-founded, for two reasons.

First, he argued that competition will resolve some of these problems (that was before and after he said we can't rely on competition, which I found confusing and may indicate that I misunderstood something). His evidence for this seemed to be that when Microsoft was dominating the world, users responded by reducing their dependence on Microsoft products and by switching to alternatives such as Linux. But, in fact, although more people may be using Linux than in the past, virtually everyone is still only using Windows, it's still almost impossible to find a programming job that isn't Windows-only, and Microsoft is still raking in money hand over fist; indeed, they're doing so well that they're planning to expand their workforce by a third. To take just one data point of many: I ride the bus to Seattle and back, and I regularly see other people using laptops as they commute. Virtually all the laptops run Windows. Occasionally I'll see a Mac, and every few months I'll see a guy running Linux on his laptop; I say "a guy" because it's the same guy every single time. So by my estimation, use of Linux on laptops in buses on the 545 route from Redmond to Seattle is way below 1%, with Macintoshes somewhere around 5%.

To broaden my point, I think Lessig has fallen into a trap that highly intelligent people, particularly those in academia, tend to fall into. In academia, you are surrounded by people who value learning and thinking deeply about the world. These are people who will give serious and unbiased consideration to questions like "what is the social effect of my acceptance of the Facebook Terms of Service?" and "If I become dependent on this service, will that possibly affect me in five years if the company decides to act against my interests?" The problem with this is that these people are not representative of the population at large. The population at large thinks "why is this check-box getting in the way of me sending pictures to my friends?" And I have no doubt that any kid who, e.g., refuses to sign up for Facebook as a protest against their TOS will be roundly mocked in their social circles for being a weird antisocial nonconformist. (Of course, this may not apply to the small minority of people like myself whose social circles consist of weird antisocial nonconformists.)

Second, Lessig's optimism about the political system seemed to be based on his observation that the people in government are, by and large, not venally corrupt: they want to do good, but the system is constructed in a way that makes this impossible. To me, that's a tremendously disheartening statement. If our problem were simply a few, or even many, corrupt politicians, this would be a solvable problem: even nowadays, Americans have some ability to choose who gets elected, and I believe that a sufficiently well-coordinated campaign could replace bad politicians with good ones.

But if the problem is that the system is corrupt even when the individuals are honest, it's a much deeper problem and frankly one that I don't know how to solve. While individuals have some direct control over who gets elected, we have no way to directly change the system that produces the corruption; the levers to do this are in the hands of the politicians elected through that broken system. It's unlikely that the politicians themselves will fix the problem, because they've benefited deeply from it: if the system is left as it is, well over 90% of Congress members will keep their jobs in the next election cycle, and any effective change seems likely (by definition) to disrupt this cushy arrangement.

On the other hand, running new candidates for office will not fix the problem. After all, if the problem is that the system is corrupt even when the participants are honest, then putting more honest people into government will have no effect. In order to get elected, these new politicians will have to become just as corrupt as the old ones, because that's how the system is set up. So we might change the faces, but we'll have the same old problems.

I recall two specific concrete proposals that Lessig made to make the system more responsive. First, he suggested shaming politicians who engage in corrupt behavior -- presumably taking large donations from interested parties. I don't see how this will help. Americans generally have a very low regard for the political class as it is; will pointing out particular examples of politicians taking money really make a difference when it comes to votes? And if it doesn't, will it really produce any change in their behavior? Politicians are motivated by votes the same way corporations are motivated by money; unless they stand a chance of losing their job (which is virtually impossible in any event), there's no reason for them to change how they behave.

His second proposal was for a system of contributions where it's impossible for any individual to prove they had donated a particular amount of money to a particular candidate. Even assuming it could be implemented perfectly, I don't think this will really solve the problem, for two reasons. First, it's not generally a secret what moneyed interests want. Politicians who want to attract large donations can just take positions that they know will appeal to donors of that sort. The donors themselves can make this easier on the candidates by, e.g., posting public position statements on issues of the day. Secondly, and far more insidiously: even if politicians honestly represent their positions, success in running for office is so tied to the amount of money the candidate can raise that only candidates who hold opinions favorable to large corporations and wealthy individuals will be able to get elected. In fact, for all I know this is what happens already!

But with all that said, it was a really interesting talk and Lessig tied together some things you might not think would go together. If a video or Flash presentation becomes available, I would recommend watching it. And hopefully my pessimistic predictions will be wrong; it's certainly happened before.

1 November 2007

Andrew McMillan: DAViCal and support for Apple's CalDAV client in OS 10.5

Quite a few people seem impressed with the new release of Leopard, and are now looking for a CalDAV server to use with their shiny new iCal app. Unfortunately it seems that Apple wrote this primarily to work with their own (free, open-source) calendar server, which has the side effect that it doesn't work with DAViCal. Part of the "doesn't work" is due to DAViCal not implementing some areas of the CalDAV specification, which is fair enough. Part of it is due to DAViCal not implementing some draft extensions to the CalDAV specification, which I can also understand, since it allows them to provide some useful features that those extensions are designed to support. There also seem to be some parts of the "doesn't work" which are due to a dependence on extensions beyond either of these cases, which is a little more disappointing - and quite a bit harder to implement. So far I have made some fixes to the first point, and some additions towards parts of the second, but as of today it still does not work. This is complicated by my not having access to a Mac. Things are looking up, however, because Tom Robinson has kindly agreed to loan me a Mac running Leopard from next week. In order to "clear the slate" for that, I will be releasing 0.9.2 over the weekend with the various minor enhancements and fixes that have been applied over the last week. So although this upcoming release will let you add your DAViCal account to iCal 3, it still won't actually work with it. I'm hoping that ready access to the application will enable me to correct that fairly quickly. Also waiting in the wings (though it unfortunately won't be in 0.9.2 either): Maxime Delorme has been working on SyncML support and is nearly ready with a patch, so we can look forward to that addition fairly soon as well.

31 October 2007

Andrew McMillan: CRM114 Awesomeness

I hate spam! Which probably puts me in the same camp as 99.99999% of the world. The other 1 in 10 million are, of course, the spammers, who seem to take the Space Invaders approach to sending e-mail: we'll keep sending you more until you die. A few years ago I used to only receive perhaps 1 every 100 seconds, which was pretty annoying, but Spamassassin was quite able to filter out 99% of those and let through about 1-2 each day, which I could deal with. My spam levels increased to maybe 1 every 20 seconds, and late in 2005 I implemented a second layer of spam filtering on my laptop using DSpam. This worked quite effectively, but DSpam is really not the tool for the job - it's much more appropriate as a company-wide antispam solution, and potentially as a replacement for Spamassassin. It drove me nuts on my laptop because its resource usage slowed down the interactive response. When I got my new laptop at the beginning of the year I decided against continuing with my rather baroque mail setup and to leave the spam filtering on the server. What I didn't realise is that my spam rate had increased again to around 1 every 8 seconds, and it has been slowly driving me to distraction ever since. It seems to have cranked up another notch recently, to perhaps 1 every 3 seconds now, so that 1% making its way through Spamassassin was getting to a very annoying several hundred each day. The longer I took to resolve it, the more time I would be wasting dealing with it every day. What I chose to apply on this occasion was CRM114, which I had some vague idea might be able to help. I was fairly impressed by the relatively simple install, but what completely blew me away was the speed with which it was able to learn to be useful. Starting from scratch, it seems to be correctly classifying over 90% of my incoming mail after about 12 hours of training, on a total of only 75 'Unsure' messages.
Even after only an hour it was getting over 50% (I'll describe my actual CRM114 installation process in a comment below). So far there have been no false positives. Now that CRM114 is installed I will be able to look into some of its other mail-classifying features, and I'm really looking forward to that.
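For context, this kind of filter typically hangs together as a little procmail glue (a hypothetical .procmailrc sketch, not my exact setup; the mailreaver.crm path and header values are assumptions to check against your own installation):

```procmail
# Pipe each message through CRM114's mailreaver classifier, which adds
# an X-CRM114-Status header recording its verdict.
:0fw
| /usr/bin/crm /usr/share/crm114/mailreaver.crm

# File on the verdict; anything UNSURE goes to its own folder, to be fed
# back to the classifier as training data.
:0:
* ^X-CRM114-Status:.*SPAM
caughtspam

:0:
* ^X-CRM114-Status:.*UNSURE
crm-unsure
```

The "train only the unsure and the mistakes" discipline is what lets it become useful after so few messages.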

30 October 2007

Andrew McMillan: Getting Blood from a Stone

Last week I installed Ubuntu Gutsy onto Heather's laptop. While Gutsy seems to be an easy task for most situations, installing it onto a Pentium 366 laptop with 200M of RAM and (particularly) an 800x600 screen was harder than it perhaps should have been. I'm sure that most installations these days aren't 800x600, but the graphical installer in Gutsy seems determined to make this painful. I had to move the toolbars to the sides of the screen, and then I could see the top half of the buttons on each page. It was like the page was sized for 600 vertical pixels, but the designer had forgotten about toolbars and title bars - not that I could see any screens in the process I followed that needed more than 5/6 of that screen anyway. Eventually I got it installed, and it even seemed to run OK once we booted into it. That's "OK for a 200M P366 with an 800x600 screen" though. Looking around at the price of a new laptop made putting up with that sort of performance a whole lot less palatable. The Acer Aspire 5310 (with free RAM upgrade) was $898 at Dick Smith, with a $99 cashback offer. A quick google shows that it's using the Broadcom 43xx wireless which isn't even close to being the best, but can be made to work with Linux. Everything else seemed likely to work, so we bought it. Installing Gutsy on it was nearly trivial, though I had to install bcm43xx-fwcutter on a different PC (my laptop, which is running Debian, in fact) to get the firmware for the WLAN before I could get the wireless working. I'm surprised that Broadcom still don't make that firmware publicly available somewhere, rather than forcing people to jump through the sort of hoops that would get them wanting an Intel chipset next time. Anyway, everything installed very easily, and the laptop is working quite nicely. Strangely, neither sound nor suspend to RAM is working out of the box, but perhaps I'll get them going in due course.
They're not so important in this case fortunately, but perhaps in due course I'll try and get them working and post some details about it. Much harder has been getting the fabled 'cashback' from Acer. I think I now know what I'm being paid $99 for. Firstly, the only way to get your cashback is by registering through a webpage. Heather's first attempt to do this resulted in an error from our proxy about a malformed request, so I got called in. I tried registering on my laptop, but couldn't even get to the cashback page. I then tried using IE6, with similar results. So perhaps it's my PC? I tried using a different PC, with the same result again! We tried ringing them up, but they were absolutely determined that (even after 20 minutes on the phone) they were not going to accept that information over the phone. So the only way to get the cashback from Acer was via their thoroughly broken website. Even their Contact Acer page is broken in Firefox, just showing a blank page. Firefox users need not apply. Eventually, while spending some time in front of Heather's main computer (which had made it all the way through to submitting their on-line form before failing) I realised that the error she was getting was a proxy error from some in-form javascript submitting an invalid request, so I disabled the proxy, the form finally worked, and I managed to apply for the cashback. Now we just have to send the printed form in, along with some blood from our firstborn, the ashes of my grandmother, various barcodes, receipts and toenail clippings and we're sweet. They say they'll send us some money within 30 days. I think we should maybe frame it or something. I just know I'm going to feel really inclined to take advantage of cashback offers in future.
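Back on the wireless side, the firmware step for the Broadcom chip looks roughly like this (a guarded sketch: the driver blob filename is illustrative and has to be obtained separately, and on the target laptop the output directory would be /lib/firmware rather than a scratch directory):

```shell
#!/bin/sh
# Sketch of extracting Broadcom firmware with bcm43xx-fwcutter. The blob
# name and output directory here are placeholders; run the cutter on a
# machine that has the tool, then copy the firmware onto the target.
DRIVER=${DRIVER:-wl_apsta.o}      # proprietary driver blob, obtained separately
FWDIR=${FWDIR:-bcm43xx-firmware}  # use /lib/firmware on the target laptop
mkdir -p "$FWDIR"
if command -v bcm43xx-fwcutter >/dev/null 2>&1 && [ -f "$DRIVER" ]; then
    # Cut the firmware images out of the blob so the free bcm43xx kernel
    # driver can load them when the interface is brought up.
    bcm43xx-fwcutter -w "$FWDIR" "$DRIVER"
else
    echo "skipping: need bcm43xx-fwcutter and $DRIVER" >&2
fi
```

Once the firmware files are in place, a reload of the bcm43xx module (or a reboot) is enough for the interface to appear.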
In Other News: DVD Slideshow. Meanwhile I've been playing with DVD Slideshow, which seems to be just what my parents have been after for a while, so they don't have to keep their favourite photos on the camera to be able to show them off on someone's TV. It's great! At least it is great now, after I changed all the calls to ffmpeg to add a 'k' after the bitrate parameter. But that's Open Source Software, I guess. I'll send a patch to them... :-)
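The bitrate fix amounts to something like this (a hypothetical illustration of the edit, not the actual dvd-slideshow patch; the exact option spelling in the script may differ, but newer ffmpeg wants an explicit 'k' unit suffix on bitrates):

```shell
# Append 'k' to a bare numeric bitrate in an ffmpeg invocation, turning
# e.g. '-b 3000' into '-b 3000k' (which ffmpeg reads as 3000 kbit/s).
cmd='ffmpeg -i slide.jpg -b 3000 out.mpg'
fixed=$(echo "$cmd" | sed -E 's/(-b )([0-9]+)( |$)/\1\2k\3/')
echo "$fixed"    # prints: ffmpeg -i slide.jpg -b 3000k out.mpg
```

Running a substitution like that over each ffmpeg call in the script is essentially the whole patch.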

16 October 2007

Andrew McMillan: Release 0.9.0 of DAViCal CalDAV server

I have not been able to put a lot of effort into DAViCal over the last couple of months, since my father was diagnosed with stomach cancer in early September, and he died on 2nd October. So here it is, finally, including a lot of refactoring work around the handling of DAV/CalDAV REPORT requests and implementation of the DAV::principal-property-search report. This also requires an upgrade to the latest AWL library (0.20), which includes a complete rewrite of the class used for parsing and rendering iCalendar data. This release is recommended, since you will need some of this stuff to support the upcoming Mozilla Calendar 0.7 release properly. At this point I have only released the files to http://debian.mcmillan.net.nz/ and I'll push it out to a wider audience if I don't hear screams of anguish from people in the next few days :-) This release does not have any associated database changes, so it should be a simple matter to install the upgraded code.

15 October 2007

MJ Ray: Mr Slef Goes to Westminster

This coming Thursday I will be at the Parliament and the Internet conference at Westminster. It will discuss topics including: What messages on these or other hot topics would you like to send to our lawmakers? I think I'm most likely to attend the *'d sessions, but I should have friends in others and there's often time to bend ears informally too. Discuss it on-list, off-list, in forums I read, on my web site's comments form - just let me know by Wednesday lunchtime, please.

22 September 2007

Andrew McMillan: Software of the week

This week has been a week of pictures for me. I was looking around on Wikipedia earlier in the week and decided that the articles around the Porirua area where I live were looking a little barren. So I've been out taking a few snaps to liven them up a little. The articles themselves don't look that great either, so I'll see if I can't put some effort into improving them as well. As a result, though, I've been playing with photographs, and I'm finding that the graphics landscape on Linux has improved significantly since the last time I was playing around with things. I've been using Hugin for making panoramas since someone pointed me to it at Debconf5 in Helsinki, but version 0.7beta4 with Panorama Tools 13 is an immense improvement. It seems that now, if I take my photos the right way, I can expect Hugin to create a panorama in only a couple of minutes, with no real effort to speak of. I built some Debian packages of Hugin 0.7beta4 and libpano13, which are here, if you're interested. Sadly, it seems that the current maintainer has not updated these packages for a long time. Something I've also been trying to do properly for some time is to produce decent HDR images. Although the tools have been around for a while, they are somewhat inaccessible, since it takes a long time to gain any kind of intuitive understanding of how they work and what parameters to use. The answer, it turns out, is something like the Hugin one: provide a GUI front end. Well, there now is one, and it's damn good. I've now started taking a few photos in triplicate so I can fiddle around with QtPfsGui and learn to recognise the situations where HDR can make a better photo. It's not a catchy name, but it's great software, and when I've managed to get my head around it I will create an HDR area on my gallery and put a few fun images in there. 
I've built some Debian packages of this also, but I've never had to deal with something using QMake before, and it doesn't seem particularly friendly to packaging. Probably I'm just beating on it the wrong way. Anyway, here are some Debian QtPfsGui packages for anyone who is interested. If you know a better way to drive qmake from a distribution point of view, maybe you could send me some tips. Something that I really like about both Hugin and QtPfsGui is the way they provide a GUI framework for some pre-existing command-line tools. The separation of UI from function is a classic application of the Unix model that will be familiar to anyone who has ever piped find into xargs, and it really does allow the whole to be greater than the sum of the parts. What is interesting about the two above is that by providing the UI they seem to have breathed some new life into the underlying functions as well. The next thing that I'm really looking forward to is the upcoming 0.46 release of the Inkscape SVG editor. This totally awesome program just gets better and better, and a bunch of the upcoming enhancements come from the Google Summer of Code. Actually, all of these programs have got quite a bit better recently because of the sponsorship they have received through GSoC. There have been a few Catalyst people mentoring GSoC projects - mostly around Moodle, Lisp and Git - so a couple of the Catalyst folks will be off to the big meetup / review in a couple of weeks. I hope those lucky stiffs pass on my thanks to everyone involved.
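That find-into-xargs composition can be sketched in a few lines (the directory and file names are made up for the demo):

```shell
# Compose two small tools: find selects files matching a predicate,
# and xargs batches the resulting names into one command invocation.
# -print0 / -0 delimit names with NUL bytes, so filenames containing
# spaces or newlines pass through safely.
mkdir -p photo-demo
touch photo-demo/pano-1.jpg photo-demo/pano-2.jpg photo-demo/notes.txt
find photo-demo -name '*.jpg' -print0 | xargs -0 ls -l
```

Each tool stays simple and single-purpose; the pipeline is where the combined behaviour lives, which is exactly the UI/function separation being praised above.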

30 August 2007

Chris Lamb: Strawman

We have a somewhat esoteric heating arrangement at my parents’ house. We burn bales of bean straw inside a large combustion chamber (similar to these) located in an outbuilding separate from the main house. A number of water-filled pipes run through the chamber, and the heated water is then pumped around the house. Environmentally, it’s really good. Most of the straw would have been burnt anyway (the rest cut into the soil), and the combustion is fan-assisted to minimise incomplete combustion. Very little ash is produced from burning such a bale. Anyway, the delivery and storage of approximately 700 straw bales has become a yearly tradition in our house. As a guide, the trailer in the picture below is holding 160. It’s pretty tough going: even individually these things weigh a considerable amount, and they have to be taken off the trailer and then stacked in a 3 metre pile inside another shed! You quickly adopt the rules of moving a bale as little as possible vertically (carry them on your back if you’re young enough!), and never, ever, moving a bale twice. There’s also a large amount of dust, and a geek’s delicate palms quickly tire of the string holding the bales together. Anyway, this happened today on a couple of hours’ notice. The other two males in the house decided to be elsewhere *cough*, leaving it up to me and my poor mum. If anyone was expecting email/code/chat/info today.. sorry.

1 July 2007

Biella Coleman: The Worst of the Web: Punditry 2.0

Tonight, instead of minding my dinner, which did burn, I was drawn into (and extended) a pretty fiery IRC conversation on debian-devel on a topic that does not like to die: the merits and demerits of Wikipedia. It is not worth summarizing the conversation here, for it followed a pretty predictable arc. There were those who thought Wikipedia was novel and valuable, others who saw it as a pit of bad facts and inaccuracies, and a few others who saw it in ways both negative and positive. I found the conversation somewhat ironic, because I usually find myself defending free software to outsiders in much the same way I was defending Wikipedia to free software developers. I tend to be in the camp of admirers, and for many reasons, although, of course, I also was arguing that it is too early to judge the value of Wikipedia, as it is in its infancy. Like Debian, Wikipedia is an institution that has changed *a lot* in its short history, so it is hard to make any hard and fast conclusions about its worth, impact, etc., although more modest and qualified claims are certainly in order. The only reason I feel like I can argue anything about Wikipedia is that I am currently reading a dissertation on Wikipedia by Joseph Reagle. He not only has really insightful things to say about the collaborative culture driving the online encyclopedia, but also about the prolific commentary that has followed closely on the heels of Wikipedia in the last few years. Just today, he wrote a blog entry entitled Punditry and The Web 2.0 debate, which hit the nail on the head about the problems, not with Wikipedia itself, but with the peanut gallery of commentary on Wikipedia. As he notes, the problem is often not with the supposedly correct or incorrect judgments on Wikipedia (or other Web 2.0 phenomena), but with the very debates themselves, because many of them are built on a foundation of sand. This punditry, as Joe rightly calls it, is nonetheless worthy of critical examination:
.. while I follow the discussion with interest, I actually don’t find it substantively engaging. Many of the arguments, particularly Gorman’s, tend to be characterized by unsubstantiated claims and the purposeful construal of nuanced issues as extremes — propping up strawmen for subsequent potshots. As I’ve already indicated, while it might bring pundits a sense of righteousness and attention, in the end “Time, not arguments, will ultimately tell.” (And, for this reason I appreciate Larry Sanger’s continuing efforts to implement his vision.) Why, then, do I find this discussion of interest? Punditry, communicative disorders, and history. First, I’m trying to come to an understanding of “punditry,” and I think Gorman’s recent bloggings is an exemplar. My sense is that sometimes people argue for arguments’ sake. That is, even if they genuinely believe the thing they are arguing for, attention, not persuasion, is the goal. (In a sense, perhaps it is a high-brow, and perhaps more genuinely held, form of trolling — another interesting phenomenon.)
While punditry has always existed, there is no doubt that the Internet has accentuated and facilitated this form of (often male) communication, and it is great to see someone tackle the topic. Because let’s face it: there is a lot more “garbage” spewing from the commentary on Web 2.0 and Wikipedia than from the actual Wikipedia articles themselves.

26 May 2007

David Welton: Airlines and rand()

Ilenia and I are looking for tickets to go "home" (Eugene, Oregon) this summer. We've had pretty good luck with Lufthansa in the past (as opposed to Air France, which lost a huge bag of my stuff), so we turned to them first to look for flights. I like the page they have for prices/dates, which gives you a nice way to look around for a better price without stabbing randomly at dates: Big long Lufthansa URL However, the maddening thing is that the prices change frequently. Not just from day to day, but sometimes within five minutes! They bounce around by up to 100 euro at a time. I understand the theory behind price discrimination, but this is the classic case where the customer ends up feeling like they're being made fun of and goes elsewhere. United, in our case, which offers a cheaper price for the same plane (they're Lufthansa's partner and thus share flights).

6 May 2007

Andrew McMillan: New eaccelerator packages for Debian Etch

My packages of eaccelerator for Etch caused me a problem last night when I pulled in a new security fix for PHP5 and they all stopped working, so I've built some new ones against the new PHP5. I've also added a few enhancements: The packages are available in my Debian repository, built for i386 and for amd64, with an apt sources line like: deb http://debian.mcmillan.net.nz/debian etch awm The repository is signed with my private key, 0x8f068012, which is fairly well-connected and which you can get from subkeys.pgp.net and many other places. Enjoy!

10 April 2007

Andrew McMillan: Moved to a new (old) server

I've now moved lots of things to a new (old) server which should be noticeably faster than the old (old) 500MHz PIII-based server. Of course, since Etch was released, I'm now running on Etch. This also meant an upgrade to PHP 5.2, so I spent some time getting bitten by this bug with Drupal 4.7.4 and PHP 5.2, and had to upgrade to 4.7.6. Consequently there has been a wee bit of bouncing up and down today... In the process I have also upgraded my packages for eAccelerator so that they work with Etch. Since Andrew Hutchings had taken them off me and played with them a bit further, I have now taken his packages and built some newer ones, including a patch for 0.9.5 to work with PHP 5.2, and an additional build dependency. My new (old) server is still 32-bit, so these packages are only built for 32-bit as yet. I'm currently trying to find a 64-bit Etch box I can build them on as well, but everyone at work seems to have gone home. You should be able to fetch them from my package repository by adding it to your sources.list with:

deb http://debian.mcmillan.net.nz/debian etch awm
Naturally that repository is signed with my public key (8f068012), which is in turn signed by lots of people in the Debian community and should be readily identifiable - not that that implies trust, of course :-). Also, since my new (old) server is in a different location (and to ease the transfer) everything is on a new IP address (and a new IPv6 address). Hopefully the system's accessibility over IPv6 on http://ipv6.geek.nz/ will be slowly improving as we incrementally improve our IPv6 connectivity with the rest of the world.
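For completeness, a minimal setup sketch, assuming an Etch-era Debian system with network access and root privileges. The repository URL and the key ID (0x8f068012) are taken from the post; the keyserver and the gpg/apt-key commands are the usual mechanisms of that era, shown here as a configuration sketch rather than a tested recipe:

```shell
# Add the repository to apt's sources (Etch-era single sources.list layout).
echo 'deb http://debian.mcmillan.net.nz/debian etch awm' \
    >> /etc/apt/sources.list

# Fetch the signing key from a keyserver and let apt trust it,
# so apt can verify the repository's Release signature.
gpg --keyserver subkeys.pgp.net --recv-keys 8F068012
gpg --export --armor 8F068012 | apt-key add -

# Refresh the package lists; the new repository is now available.
apt-get update
```

As the author notes, having the key in apt's keyring only proves who signed the packages, not that they should be trusted; check the key's signatures yourself.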

10 November 2006

Daniel Stone: ripping on ubuntu is the new black

One of the annoying features of reading several planets is that you end up reading entries from people you otherwise respect who seem to have various complexes about nonsensical issues. We all saw the month in which half of Debian decided it was cool to rip on Ubuntu, and now apparently it's the new cool thing to do if you work on Fedora.

According to a particularly obnoxious entry, the quality or otherwise of Ubuntu's kernel[0] does 'damage to the free software community'. Now, they've focussed on binary drivers. Which is fair enough: I am personally quite uncomfortable with binary drivers, and extremely unimpressed with Ubuntu's recent move towards binary drivers by default. While I respect the fact that they still encourage users to install free drivers, providing them by default is not a winning move, in my opinion. But I'm not going to discuss this further right now, because if I do, the terrorists win.

The reason the terrorists win is that some people like to furiously handwave and construct strawmen. Apparently everyone who's ever touched Ubuntu is now rabidly in favour of binary drivers; but the real problem here is the handwave from criticism of Fedora's alleged community into yet another ramble about the kernel tree and binary drivers.

I certainly hope the Fedora community is more mature than this, both in general and in its ability to have a constructive debate about the level of its community involvement. Right now, it is not even disputed that there is effectively none. At first glance, the two distributions may appear similar, but if you look further, the similarities vanish. Anyone can come in and get involved in the decision making in Ubuntu, but more important than that, the process is at least completely transparent, so you can see what's going on, on public lists, at every step. Fedora does not have this: decisions magically appear On High from RHEL types. This is not necessarily a bad thing, and does not necessarily result in a lower-quality distribution, but if everyone could just accept this as being true and move on, rather than bitterly baiting a successful community distribution about it, I think everyone would be a lot happier.

The biggest difference is -- yes, basically all the Ubuntu core developers are employed by Canonical. These people were handpicked from the Debian community originally, and now a great deal of them start as Ubuntu community members, and are employed by Canonical because having them work full-time on this stuff would be awesome. The process is still entirely transparent: being employed is a mark of recognition (sorry), and a request to continue working on this even more, rather than a pre-requisite to being able to do anything. Witness Fedora Core vs. Fedora Extras, for example: there are quite a few packages that non-Red Hat employees are prevented from meaningfully contributing to.

So, the summary in a nutshell: binary drivers bad (and Ubuntu moves towards promoting same even worse), Fedora 'community' not at all, handwaves bad, refusal to get involved with a meaningful debate about community governance and structure without engaging in petty mudslinging and pointless handwaves inexcusable.

(Disclaimer: I am not involved with any distributions, nor do I have any professional affiliation with them either. I'm just an irritated observer.)

[0]: I don't claim to have any insider knowledge here; the only point at which its quality or otherwise has become an issue for me lately, is a random hang in gettimeofday() in a corner case recently, which I've been too lazy to isolate and report as a proper bug; I worked around it in the X server anyway.
